feat: Update /responses OpenAPI spec to make model and input fields optional #180
Walkthrough: The OpenAPI specification for the `/responses` endpoint was updated to make the `model` and `input` request fields optional.
Actionable comments posted: 1
🧹 Nitpick comments (1)
src/libs/tryAGI.OpenAI/openapi.yaml (1)
11783-11789: Verify OpenAPI schema changes and defaults

An inline `type: object` block was added under the `allOf`, and the `required` array for `model`/`input` was dropped. Please:

- Run this through an OpenAPI linter/validator to ensure the `allOf` merge is valid.
- Document or set defaults for the now-optional `model` and `input` fields in the schema or API description.
- Double-check YAML indentation and hyphens around the newly inserted inline schema to avoid syntax issues.
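For reference, a minimal sketch of the shape this comment describes — a hypothetical reduction of the schema with only the two affected properties, not the actual spec contents:

```yaml
# Hypothetical fragment: the inline `type: object` block sits under
# `allOf`, and the `required` array listing model/input is gone.
CreateResponse:
  allOf:
    - type: object            # newly inserted inline block
    - type: object
      properties:
        model:
          type: string
        input:
          type: string
      # required: [model, input]   # dropped by this change
```

Because `required` constraints in an `allOf` apply per subschema, removing the array from the only branch that declared it makes an empty request body schema-valid.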
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`src/libs/tryAGI.OpenAI/Generated/tryAGI.OpenAI.Models.CreateResponseVariant3.g.cs` is excluded by `!**/generated/**`
📒 Files selected for processing (1)
`src/libs/tryAGI.OpenAI/openapi.yaml` (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Test / Build, test and publish
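The effect of dropping the `required` array can be simulated with a small stand-in check. This is a sketch under the assumption that the merged schema reduces to the fragment below; `missing_required` is an illustrative helper, not part of any SDK or validator library:

```python
# Simulate how a validator treats the request body once "model" and
# "input" leave the "required" array. The schema dict is a hypothetical
# reduction of the real CreateResponse spec, not the actual contents.

schema = {
    "allOf": [
        {"type": "object"},  # the newly inserted inline block
        {
            "type": "object",
            "properties": {
                "model": {"type": "string"},
                "input": {"type": "string"},
            },
            # "required": ["model", "input"],  # dropped by this PR
        },
    ]
}

def missing_required(payload: dict, schema: dict) -> list[str]:
    """Collect required-but-absent keys across all allOf branches."""
    missing = []
    for branch in schema.get("allOf", [schema]):
        for key in branch.get("required", []):
            if key not in payload:
                missing.append(key)
    return missing

# With the required array dropped, an empty body now passes the check.
print(missing_required({}, schema))  # []
```

This mirrors why the regenerated SDK samples below gain parameterless `create()` calls alongside the older `input`/`model` variants.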
| request: | ||
| curl: "curl https://api.openai.com/v1/responses \\\n -H \"Content-Type: application/json\" \\\n -H \"Authorization: Bearer $OPENAI_API_KEY\" \\\n -d '{\n \"model\": \"gpt-4.1\",\n \"input\": \"Tell me a three sentence bedtime story about a unicorn.\"\n }'\n" | ||
| javascript: "import OpenAI from \"openai\";\n\nconst openai = new OpenAI();\n\nconst response = await openai.responses.create({\n model: \"gpt-4.1\",\n input: \"Tell me a three sentence bedtime story about a unicorn.\"\n});\n\nconsole.log(response);\n" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create(\n input=\"string\",\n model=\"gpt-4o\",\n)\nprint(response.id)" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create()\nprint(response.id)" | ||
| csharp: "using System;\nusing OpenAI.Responses;\n\nOpenAIResponseClient client = new(\n model: \"gpt-4.1\",\n apiKey: Environment.GetEnvironmentVariable(\"OPENAI_API_KEY\")\n);\n\nOpenAIResponse response = client.CreateResponse(\"Tell me a three sentence bedtime story about a unicorn.\");\n\nConsole.WriteLine(response.GetOutputText());\n" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create({ input: 'string', model: 'gpt-4o' });\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n \"github.com/openai/openai-go/shared\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n Input: responses.ResponseNewParamsInputUnion{\n OfString: openai.String(\"string\"),\n },\n Model: shared.ResponsesModel(\"gpt-4o\"),\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.ChatModel;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n ResponseCreateParams params = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build();\n Response response = client.responses().create(params);\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.ChatModel\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val params: ResponseCreateParams = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build()\n val response: Response = client.responses().create(params)\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create(input: \"string\", model: :\"gpt-4o\")\n\nputs(response)" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create();\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n Response response = client.responses().create();\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val response: Response = client.responses().create()\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create\n\nputs(response)" | ||
| response: "{\n \"id\": \"resp_67ccd2bed1ec8190b14f964abc0542670bb6a6b452d3795b\",\n \"object\": \"response\",\n \"created_at\": 1741476542,\n \"status\": \"completed\",\n \"error\": null,\n \"incomplete_details\": null,\n \"instructions\": null,\n \"max_output_tokens\": null,\n \"model\": \"gpt-4.1-2025-04-14\",\n \"output\": [\n {\n \"type\": \"message\",\n \"id\": \"msg_67ccd2bf17f0819081ff3bb2cf6508e60bb6a6b452d3795b\",\n \"status\": \"completed\",\n \"role\": \"assistant\",\n \"content\": [\n {\n \"type\": \"output_text\",\n \"text\": \"In a peaceful grove beneath a silver moon, a unicorn named Lumina discovered a hidden pool that reflected the stars. As she dipped her horn into the water, the pool began to shimmer, revealing a pathway to a magical realm of endless night skies. Filled with wonder, Lumina whispered a wish for all who dream to find their own hidden magic, and as she glanced back, her hoofprints sparkled like stardust.\",\n \"annotations\": []\n }\n ]\n }\n ],\n \"parallel_tool_calls\": true,\n \"previous_response_id\": null,\n \"reasoning\": {\n \"effort\": null,\n \"summary\": null\n },\n \"store\": true,\n \"temperature\": 1.0,\n \"text\": {\n \"format\": {\n \"type\": \"text\"\n }\n },\n \"tool_choice\": \"auto\",\n \"tools\": [],\n \"top_p\": 1.0,\n \"truncation\": \"disabled\",\n \"usage\": {\n \"input_tokens\": 36,\n \"input_tokens_details\": {\n \"cached_tokens\": 0\n },\n \"output_tokens\": 87,\n \"output_tokens_details\": {\n \"reasoning_tokens\": 0\n },\n \"total_tokens\": 123\n },\n \"user\": null,\n \"metadata\": {}\n}\n" | ||
| - title: Image input | ||
| request: | ||
| curl: "curl https://api.openai.com/v1/responses \\\n -H \"Content-Type: application/json\" \\\n -H \"Authorization: Bearer $OPENAI_API_KEY\" \\\n -d '{\n \"model\": \"gpt-4.1\",\n \"input\": [\n {\n \"role\": \"user\",\n \"content\": [\n {\"type\": \"input_text\", \"text\": \"what is in this image?\"},\n {\n \"type\": \"input_image\",\n \"image_url\": \"https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg\"\n }\n ]\n }\n ]\n }'\n" | ||
| javascript: "import OpenAI from \"openai\";\n\nconst openai = new OpenAI();\n\nconst response = await openai.responses.create({\n model: \"gpt-4.1\",\n input: [\n {\n role: \"user\",\n content: [\n { type: \"input_text\", text: \"what is in this image?\" },\n {\n type: \"input_image\",\n image_url:\n \"https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg\",\n },\n ],\n },\n ],\n});\n\nconsole.log(response);\n" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create(\n input=\"string\",\n model=\"gpt-4o\",\n)\nprint(response.id)" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create()\nprint(response.id)" | ||
| csharp: "using System;\nusing System.Collections.Generic;\n\nusing OpenAI.Responses;\n\nOpenAIResponseClient client = new(\n model: \"gpt-4.1\",\n apiKey: Environment.GetEnvironmentVariable(\"OPENAI_API_KEY\")\n);\n\nList<ResponseItem> inputItems =\n[\n ResponseItem.CreateUserMessageItem(\n [\n ResponseContentPart.CreateInputTextPart(\"What is in this image?\"),\n ResponseContentPart.CreateInputImagePart(new Uri(\"https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg\"))\n ]\n )\n];\n\nOpenAIResponse response = client.CreateResponse(inputItems);\n\nConsole.WriteLine(response.GetOutputText());\n" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create({ input: 'string', model: 'gpt-4o' });\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n \"github.com/openai/openai-go/shared\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n Input: responses.ResponseNewParamsInputUnion{\n OfString: openai.String(\"string\"),\n },\n Model: shared.ResponsesModel(\"gpt-4o\"),\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.ChatModel;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n ResponseCreateParams params = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build();\n Response response = client.responses().create(params);\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.ChatModel\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val params: ResponseCreateParams = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build()\n val response: Response = client.responses().create(params)\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create(input: \"string\", model: :\"gpt-4o\")\n\nputs(response)" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create();\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n Response response = client.responses().create();\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val response: Response = client.responses().create()\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create\n\nputs(response)" | ||
| response: "{\n \"id\": \"resp_67ccd3a9da748190baa7f1570fe91ac604becb25c45c1d41\",\n \"object\": \"response\",\n \"created_at\": 1741476777,\n \"status\": \"completed\",\n \"error\": null,\n \"incomplete_details\": null,\n \"instructions\": null,\n \"max_output_tokens\": null,\n \"model\": \"gpt-4.1-2025-04-14\",\n \"output\": [\n {\n \"type\": \"message\",\n \"id\": \"msg_67ccd3acc8d48190a77525dc6de64b4104becb25c45c1d41\",\n \"status\": \"completed\",\n \"role\": \"assistant\",\n \"content\": [\n {\n \"type\": \"output_text\",\n \"text\": \"The image depicts a scenic landscape with a wooden boardwalk or pathway leading through lush, green grass under a blue sky with some clouds. The setting suggests a peaceful natural area, possibly a park or nature reserve. There are trees and shrubs in the background.\",\n \"annotations\": []\n }\n ]\n }\n ],\n \"parallel_tool_calls\": true,\n \"previous_response_id\": null,\n \"reasoning\": {\n \"effort\": null,\n \"summary\": null\n },\n \"store\": true,\n \"temperature\": 1.0,\n \"text\": {\n \"format\": {\n \"type\": \"text\"\n }\n },\n \"tool_choice\": \"auto\",\n \"tools\": [],\n \"top_p\": 1.0,\n \"truncation\": \"disabled\",\n \"usage\": {\n \"input_tokens\": 328,\n \"input_tokens_details\": {\n \"cached_tokens\": 0\n },\n \"output_tokens\": 52,\n \"output_tokens_details\": {\n \"reasoning_tokens\": 0\n },\n \"total_tokens\": 380\n },\n \"user\": null,\n \"metadata\": {}\n}\n" | ||
| - title: Web search | ||
| request: | ||
| curl: "curl https://api.openai.com/v1/responses \\\n -H \"Content-Type: application/json\" \\\n -H \"Authorization: Bearer $OPENAI_API_KEY\" \\\n -d '{\n \"model\": \"gpt-4.1\",\n \"tools\": [{ \"type\": \"web_search_preview\" }],\n \"input\": \"What was a positive news story from today?\"\n }'\n" | ||
| javascript: "import OpenAI from \"openai\";\n\nconst openai = new OpenAI();\n\nconst response = await openai.responses.create({\n model: \"gpt-4.1\",\n tools: [{ type: \"web_search_preview\" }],\n input: \"What was a positive news story from today?\",\n});\n\nconsole.log(response);\n" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create(\n input=\"string\",\n model=\"gpt-4o\",\n)\nprint(response.id)" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create()\nprint(response.id)" | ||
| csharp: "using System;\n\nusing OpenAI.Responses;\n\nOpenAIResponseClient client = new(\n model: \"gpt-4.1\",\n apiKey: Environment.GetEnvironmentVariable(\"OPENAI_API_KEY\")\n);\n\nstring userInputText = \"What was a positive news story from today?\";\n\nResponseCreationOptions options = new()\n{\n Tools =\n {\n ResponseTool.CreateWebSearchTool()\n },\n};\n\nOpenAIResponse response = client.CreateResponse(userInputText, options);\n\nConsole.WriteLine(response.GetOutputText());\n" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create({ input: 'string', model: 'gpt-4o' });\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n \"github.com/openai/openai-go/shared\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n Input: responses.ResponseNewParamsInputUnion{\n OfString: openai.String(\"string\"),\n },\n Model: shared.ResponsesModel(\"gpt-4o\"),\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.ChatModel;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n ResponseCreateParams params = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build();\n Response response = client.responses().create(params);\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.ChatModel\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val params: ResponseCreateParams = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build()\n val response: Response = client.responses().create(params)\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create(input: \"string\", model: :\"gpt-4o\")\n\nputs(response)" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create();\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n Response response = client.responses().create();\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val response: Response = client.responses().create()\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create\n\nputs(response)" | ||
| response: "{\n \"id\": \"resp_67ccf18ef5fc8190b16dbee19bc54e5f087bb177ab789d5c\",\n \"object\": \"response\",\n \"created_at\": 1741484430,\n \"status\": \"completed\",\n \"error\": null,\n \"incomplete_details\": null,\n \"instructions\": null,\n \"max_output_tokens\": null,\n \"model\": \"gpt-4.1-2025-04-14\",\n \"output\": [\n {\n \"type\": \"web_search_call\",\n \"id\": \"ws_67ccf18f64008190a39b619f4c8455ef087bb177ab789d5c\",\n \"status\": \"completed\"\n },\n {\n \"type\": \"message\",\n \"id\": \"msg_67ccf190ca3881909d433c50b1f6357e087bb177ab789d5c\",\n \"status\": \"completed\",\n \"role\": \"assistant\",\n \"content\": [\n {\n \"type\": \"output_text\",\n \"text\": \"As of today, March 9, 2025, one notable positive news story...\",\n \"annotations\": [\n {\n \"type\": \"url_citation\",\n \"start_index\": 442,\n \"end_index\": 557,\n \"url\": \"https://.../?utm_source=chatgpt.com\",\n \"title\": \"...\"\n },\n {\n \"type\": \"url_citation\",\n \"start_index\": 962,\n \"end_index\": 1077,\n \"url\": \"https://.../?utm_source=chatgpt.com\",\n \"title\": \"...\"\n },\n {\n \"type\": \"url_citation\",\n \"start_index\": 1336,\n \"end_index\": 1451,\n \"url\": \"https://.../?utm_source=chatgpt.com\",\n \"title\": \"...\"\n }\n ]\n }\n ]\n }\n ],\n \"parallel_tool_calls\": true,\n \"previous_response_id\": null,\n \"reasoning\": {\n \"effort\": null,\n \"summary\": null\n },\n \"store\": true,\n \"temperature\": 1.0,\n \"text\": {\n \"format\": {\n \"type\": \"text\"\n }\n },\n \"tool_choice\": \"auto\",\n \"tools\": [\n {\n \"type\": \"web_search_preview\",\n \"domains\": [],\n \"search_context_size\": \"medium\",\n \"user_location\": {\n \"type\": \"approximate\",\n \"city\": null,\n \"country\": \"US\",\n \"region\": null,\n \"timezone\": null\n }\n }\n ],\n \"top_p\": 1.0,\n \"truncation\": \"disabled\",\n \"usage\": {\n \"input_tokens\": 328,\n \"input_tokens_details\": {\n \"cached_tokens\": 0\n },\n \"output_tokens\": 356,\n \"output_tokens_details\": {\n \"reasoning_tokens\": 0\n },\n \"total_tokens\": 684\n },\n \"user\": null,\n \"metadata\": {}\n}\n" | ||
| - title: File search | ||
| request: | ||
| curl: "curl https://api.openai.com/v1/responses \\\n -H \"Content-Type: application/json\" \\\n -H \"Authorization: Bearer $OPENAI_API_KEY\" \\\n -d '{\n \"model\": \"gpt-4.1\",\n \"tools\": [{\n \"type\": \"file_search\",\n \"vector_store_ids\": [\"vs_1234567890\"],\n \"max_num_results\": 20\n }],\n \"input\": \"What are the attributes of an ancient brown dragon?\"\n }'\n" | ||
| javascript: "import OpenAI from \"openai\";\n\nconst openai = new OpenAI();\n\nconst response = await openai.responses.create({\n model: \"gpt-4.1\",\n tools: [{\n type: \"file_search\",\n vector_store_ids: [\"vs_1234567890\"],\n max_num_results: 20\n }],\n input: \"What are the attributes of an ancient brown dragon?\",\n});\n\nconsole.log(response);\n" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create(\n input=\"string\",\n model=\"gpt-4o\",\n)\nprint(response.id)" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create()\nprint(response.id)" | ||
| csharp: "using System;\n\nusing OpenAI.Responses;\n\nOpenAIResponseClient client = new(\n model: \"gpt-4.1\",\n apiKey: Environment.GetEnvironmentVariable(\"OPENAI_API_KEY\")\n);\n\nstring userInputText = \"What are the attributes of an ancient brown dragon?\";\n\nResponseCreationOptions options = new()\n{\n Tools =\n {\n ResponseTool.CreateFileSearchTool(\n vectorStoreIds: [\"vs_1234567890\"],\n maxResultCount: 20\n )\n },\n};\n\nOpenAIResponse response = client.CreateResponse(userInputText, options);\n\nConsole.WriteLine(response.GetOutputText());\n" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create({ input: 'string', model: 'gpt-4o' });\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n \"github.com/openai/openai-go/shared\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n Input: responses.ResponseNewParamsInputUnion{\n OfString: openai.String(\"string\"),\n },\n Model: shared.ResponsesModel(\"gpt-4o\"),\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.ChatModel;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n ResponseCreateParams params = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build();\n Response response = client.responses().create(params);\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.ChatModel\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val params: ResponseCreateParams = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build()\n val response: Response = client.responses().create(params)\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create(input: \"string\", model: :\"gpt-4o\")\n\nputs(response)" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create();\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n Response response = client.responses().create();\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val response: Response = client.responses().create()\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create\n\nputs(response)" | ||
| response: "{\n \"id\": \"resp_67ccf4c55fc48190b71bd0463ad3306d09504fb6872380d7\",\n \"object\": \"response\",\n \"created_at\": 1741485253,\n \"status\": \"completed\",\n \"error\": null,\n \"incomplete_details\": null,\n \"instructions\": null,\n \"max_output_tokens\": null,\n \"model\": \"gpt-4.1-2025-04-14\",\n \"output\": [\n {\n \"type\": \"file_search_call\",\n \"id\": \"fs_67ccf4c63cd08190887ef6464ba5681609504fb6872380d7\",\n \"status\": \"completed\",\n \"queries\": [\n \"attributes of an ancient brown dragon\"\n ],\n \"results\": null\n },\n {\n \"type\": \"message\",\n \"id\": \"msg_67ccf4c93e5c81909d595b369351a9d309504fb6872380d7\",\n \"status\": \"completed\",\n \"role\": \"assistant\",\n \"content\": [\n {\n \"type\": \"output_text\",\n \"text\": \"The attributes of an ancient brown dragon include...\",\n \"annotations\": [\n {\n \"type\": \"file_citation\",\n \"index\": 320,\n \"file_id\": \"file-4wDz5b167pAf72nx1h9eiN\",\n \"filename\": \"dragons.pdf\"\n },\n {\n \"type\": \"file_citation\",\n \"index\": 576,\n \"file_id\": \"file-4wDz5b167pAf72nx1h9eiN\",\n \"filename\": \"dragons.pdf\"\n },\n {\n \"type\": \"file_citation\",\n \"index\": 815,\n \"file_id\": \"file-4wDz5b167pAf72nx1h9eiN\",\n \"filename\": \"dragons.pdf\"\n },\n {\n \"type\": \"file_citation\",\n \"index\": 815,\n \"file_id\": \"file-4wDz5b167pAf72nx1h9eiN\",\n \"filename\": \"dragons.pdf\"\n },\n {\n \"type\": \"file_citation\",\n \"index\": 1030,\n \"file_id\": \"file-4wDz5b167pAf72nx1h9eiN\",\n \"filename\": \"dragons.pdf\"\n },\n {\n \"type\": \"file_citation\",\n \"index\": 1030,\n \"file_id\": \"file-4wDz5b167pAf72nx1h9eiN\",\n \"filename\": \"dragons.pdf\"\n },\n {\n \"type\": \"file_citation\",\n \"index\": 1156,\n \"file_id\": \"file-4wDz5b167pAf72nx1h9eiN\",\n \"filename\": \"dragons.pdf\"\n },\n {\n \"type\": \"file_citation\",\n \"index\": 1225,\n \"file_id\": \"file-4wDz5b167pAf72nx1h9eiN\",\n \"filename\": \"dragons.pdf\"\n }\n ]\n }\n ]\n }\n ],\n 
\"parallel_tool_calls\": true,\n \"previous_response_id\": null,\n \"reasoning\": {\n \"effort\": null,\n \"summary\": null\n },\n \"store\": true,\n \"temperature\": 1.0,\n \"text\": {\n \"format\": {\n \"type\": \"text\"\n }\n },\n \"tool_choice\": \"auto\",\n \"tools\": [\n {\n \"type\": \"file_search\",\n \"filters\": null,\n \"max_num_results\": 20,\n \"ranking_options\": {\n \"ranker\": \"auto\",\n \"score_threshold\": 0.0\n },\n \"vector_store_ids\": [\n \"vs_1234567890\"\n ]\n }\n ],\n \"top_p\": 1.0,\n \"truncation\": \"disabled\",\n \"usage\": {\n \"input_tokens\": 18307,\n \"input_tokens_details\": {\n \"cached_tokens\": 0\n },\n \"output_tokens\": 348,\n \"output_tokens_details\": {\n \"reasoning_tokens\": 0\n },\n \"total_tokens\": 18655\n },\n \"user\": null,\n \"metadata\": {}\n} \n" | ||
| - title: Streaming | ||
| request: | ||
| curl: "curl https://api.openai.com/v1/responses \\\n -H \"Content-Type: application/json\" \\\n -H \"Authorization: Bearer $OPENAI_API_KEY\" \\\n -d '{\n \"model\": \"gpt-4.1\",\n \"instructions\": \"You are a helpful assistant.\",\n \"input\": \"Hello!\",\n \"stream\": true\n }'\n" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create(\n input=\"string\",\n model=\"gpt-4o\",\n)\nprint(response.id)" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create()\nprint(response.id)" | ||
| javascript: "import OpenAI from \"openai\";\n\nconst openai = new OpenAI();\n\nconst response = await openai.responses.create({\n model: \"gpt-4.1\",\n instructions: \"You are a helpful assistant.\",\n input: \"Hello!\",\n stream: true,\n});\n\nfor await (const event of response) {\n console.log(event);\n}\n" | ||
| csharp: "using System;\nusing System.ClientModel;\nusing System.Threading.Tasks;\n\nusing OpenAI.Responses;\n\nOpenAIResponseClient client = new(\n model: \"gpt-4.1\",\n apiKey: Environment.GetEnvironmentVariable(\"OPENAI_API_KEY\")\n);\n\nstring userInputText = \"Hello!\";\n\nResponseCreationOptions options = new()\n{\n Instructions = \"You are a helpful assistant.\",\n};\n\nAsyncCollectionResult<StreamingResponseUpdate> responseUpdates = client.CreateResponseStreamingAsync(userInputText, options);\n\nawait foreach (StreamingResponseUpdate responseUpdate in responseUpdates)\n{\n if (responseUpdate is StreamingResponseOutputTextDeltaUpdate outputTextDeltaUpdate)\n {\n Console.Write(outputTextDeltaUpdate.Delta);\n }\n}\n" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create({ input: 'string', model: 'gpt-4o' });\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n \"github.com/openai/openai-go/shared\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n Input: responses.ResponseNewParamsInputUnion{\n OfString: openai.String(\"string\"),\n },\n Model: shared.ResponsesModel(\"gpt-4o\"),\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.ChatModel;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n ResponseCreateParams params = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build();\n Response response = client.responses().create(params);\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.ChatModel\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val params: ResponseCreateParams = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build()\n val response: Response = client.responses().create(params)\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create(input: \"string\", model: :\"gpt-4o\")\n\nputs(response)" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create();\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n Response response = client.responses().create();\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val response: Response = client.responses().create()\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create\n\nputs(response)" | ||
| response: "event: response.created\ndata: {\"type\":\"response.created\",\"response\":{\"id\":\"resp_67c9fdcecf488190bdd9a0409de3a1ec07b8b0ad4e5eb654\",\"object\":\"response\",\"created_at\":1741290958,\"status\":\"in_progress\",\"error\":null,\"incomplete_details\":null,\"instructions\":\"You are a helpful assistant.\",\"max_output_tokens\":null,\"model\":\"gpt-4.1-2025-04-14\",\"output\":[],\"parallel_tool_calls\":true,\"previous_response_id\":null,\"reasoning\":{\"effort\":null,\"summary\":null},\"store\":true,\"temperature\":1.0,\"text\":{\"format\":{\"type\":\"text\"}},\"tool_choice\":\"auto\",\"tools\":[],\"top_p\":1.0,\"truncation\":\"disabled\",\"usage\":null,\"user\":null,\"metadata\":{}}}\n\nevent: response.in_progress\ndata: {\"type\":\"response.in_progress\",\"response\":{\"id\":\"resp_67c9fdcecf488190bdd9a0409de3a1ec07b8b0ad4e5eb654\",\"object\":\"response\",\"created_at\":1741290958,\"status\":\"in_progress\",\"error\":null,\"incomplete_details\":null,\"instructions\":\"You are a helpful assistant.\",\"max_output_tokens\":null,\"model\":\"gpt-4.1-2025-04-14\",\"output\":[],\"parallel_tool_calls\":true,\"previous_response_id\":null,\"reasoning\":{\"effort\":null,\"summary\":null},\"store\":true,\"temperature\":1.0,\"text\":{\"format\":{\"type\":\"text\"}},\"tool_choice\":\"auto\",\"tools\":[],\"top_p\":1.0,\"truncation\":\"disabled\",\"usage\":null,\"user\":null,\"metadata\":{}}}\n\nevent: response.output_item.added\ndata: {\"type\":\"response.output_item.added\",\"output_index\":0,\"item\":{\"id\":\"msg_67c9fdcf37fc8190ba82116e33fb28c507b8b0ad4e5eb654\",\"type\":\"message\",\"status\":\"in_progress\",\"role\":\"assistant\",\"content\":[]}}\n\nevent: response.content_part.added\ndata: {\"type\":\"response.content_part.added\",\"item_id\":\"msg_67c9fdcf37fc8190ba82116e33fb28c507b8b0ad4e5eb654\",\"output_index\":0,\"content_index\":0,\"part\":{\"type\":\"output_text\",\"text\":\"\",\"annotations\":[]}}\n\nevent: response.output_text.delta\ndata: 
{\"type\":\"response.output_text.delta\",\"item_id\":\"msg_67c9fdcf37fc8190ba82116e33fb28c507b8b0ad4e5eb654\",\"output_index\":0,\"content_index\":0,\"delta\":\"Hi\"}\n\n...\n\nevent: response.output_text.done\ndata: {\"type\":\"response.output_text.done\",\"item_id\":\"msg_67c9fdcf37fc8190ba82116e33fb28c507b8b0ad4e5eb654\",\"output_index\":0,\"content_index\":0,\"text\":\"Hi there! How can I assist you today?\"}\n\nevent: response.content_part.done\ndata: {\"type\":\"response.content_part.done\",\"item_id\":\"msg_67c9fdcf37fc8190ba82116e33fb28c507b8b0ad4e5eb654\",\"output_index\":0,\"content_index\":0,\"part\":{\"type\":\"output_text\",\"text\":\"Hi there! How can I assist you today?\",\"annotations\":[]}}\n\nevent: response.output_item.done\ndata: {\"type\":\"response.output_item.done\",\"output_index\":0,\"item\":{\"id\":\"msg_67c9fdcf37fc8190ba82116e33fb28c507b8b0ad4e5eb654\",\"type\":\"message\",\"status\":\"completed\",\"role\":\"assistant\",\"content\":[{\"type\":\"output_text\",\"text\":\"Hi there! How can I assist you today?\",\"annotations\":[]}]}}\n\nevent: response.completed\ndata: {\"type\":\"response.completed\",\"response\":{\"id\":\"resp_67c9fdcecf488190bdd9a0409de3a1ec07b8b0ad4e5eb654\",\"object\":\"response\",\"created_at\":1741290958,\"status\":\"completed\",\"error\":null,\"incomplete_details\":null,\"instructions\":\"You are a helpful assistant.\",\"max_output_tokens\":null,\"model\":\"gpt-4.1-2025-04-14\",\"output\":[{\"id\":\"msg_67c9fdcf37fc8190ba82116e33fb28c507b8b0ad4e5eb654\",\"type\":\"message\",\"status\":\"completed\",\"role\":\"assistant\",\"content\":[{\"type\":\"output_text\",\"text\":\"Hi there! 
How can I assist you today?\",\"annotations\":[]}]}],\"parallel_tool_calls\":true,\"previous_response_id\":null,\"reasoning\":{\"effort\":null,\"summary\":null},\"store\":true,\"temperature\":1.0,\"text\":{\"format\":{\"type\":\"text\"}},\"tool_choice\":\"auto\",\"tools\":[],\"top_p\":1.0,\"truncation\":\"disabled\",\"usage\":{\"input_tokens\":37,\"output_tokens\":11,\"output_tokens_details\":{\"reasoning_tokens\":0},\"total_tokens\":48},\"user\":null,\"metadata\":{}}}\n" | ||
| - title: Functions | ||
| request: | ||
| curl: "curl https://api.openai.com/v1/responses \\\n -H \"Content-Type: application/json\" \\\n -H \"Authorization: Bearer $OPENAI_API_KEY\" \\\n -d '{\n \"model\": \"gpt-4.1\",\n \"input\": \"What is the weather like in Boston today?\",\n \"tools\": [\n {\n \"type\": \"function\",\n \"name\": \"get_current_weather\",\n \"description\": \"Get the current weather in a given location\",\n \"parameters\": {\n \"type\": \"object\",\n \"properties\": {\n \"location\": {\n \"type\": \"string\",\n \"description\": \"The city and state, e.g. San Francisco, CA\"\n },\n \"unit\": {\n \"type\": \"string\",\n \"enum\": [\"celsius\", \"fahrenheit\"]\n }\n },\n \"required\": [\"location\", \"unit\"]\n }\n }\n ],\n \"tool_choice\": \"auto\"\n }'\n" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create(\n input=\"string\",\n model=\"gpt-4o\",\n)\nprint(response.id)" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create()\nprint(response.id)" | ||
| javascript: "import OpenAI from \"openai\";\n\nconst openai = new OpenAI();\n\nconst tools = [\n {\n type: \"function\",\n name: \"get_current_weather\",\n description: \"Get the current weather in a given location\",\n parameters: {\n type: \"object\",\n properties: {\n location: {\n type: \"string\",\n description: \"The city and state, e.g. San Francisco, CA\",\n },\n unit: { type: \"string\", enum: [\"celsius\", \"fahrenheit\"] },\n },\n required: [\"location\", \"unit\"],\n },\n },\n];\n\nconst response = await openai.responses.create({\n model: \"gpt-4.1\",\n tools: tools,\n input: \"What is the weather like in Boston today?\",\n tool_choice: \"auto\",\n});\n\nconsole.log(response);\n" | ||
| csharp: "using System;\nusing OpenAI.Responses;\n\nOpenAIResponseClient client = new(\n model: \"gpt-4.1\",\n apiKey: Environment.GetEnvironmentVariable(\"OPENAI_API_KEY\")\n);\n\nResponseTool getCurrentWeatherFunctionTool = ResponseTool.CreateFunctionTool(\n functionName: \"get_current_weather\",\n functionDescription: \"Get the current weather in a given location\",\n functionParameters: BinaryData.FromString(\"\"\"\n {\n \"type\": \"object\",\n \"properties\": {\n \"location\": {\n \"type\": \"string\",\n \"description\": \"The city and state, e.g. San Francisco, CA\"\n },\n \"unit\": {\"type\": \"string\", \"enum\": [\"celsius\", \"fahrenheit\"]}\n },\n \"required\": [\"location\", \"unit\"]\n }\n \"\"\"\n )\n);\n\nstring userInputText = \"What is the weather like in Boston today?\";\n\nResponseCreationOptions options = new()\n{\n Tools =\n {\n getCurrentWeatherFunctionTool\n },\n ToolChoice = ResponseToolChoice.CreateAutoChoice(),\n};\n\nOpenAIResponse response = client.CreateResponse(userInputText, options);\n" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create({ input: 'string', model: 'gpt-4o' });\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n \"github.com/openai/openai-go/shared\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n Input: responses.ResponseNewParamsInputUnion{\n OfString: openai.String(\"string\"),\n },\n Model: shared.ResponsesModel(\"gpt-4o\"),\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.ChatModel;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n ResponseCreateParams params = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build();\n Response response = client.responses().create(params);\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.ChatModel\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val params: ResponseCreateParams = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build()\n val response: Response = client.responses().create(params)\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create(input: \"string\", model: :\"gpt-4o\")\n\nputs(response)" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create();\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n Response response = client.responses().create();\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val response: Response = client.responses().create()\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create\n\nputs(response)" | ||
| response: "{\n \"id\": \"resp_67ca09c5efe0819096d0511c92b8c890096610f474011cc0\",\n \"object\": \"response\",\n \"created_at\": 1741294021,\n \"status\": \"completed\",\n \"error\": null,\n \"incomplete_details\": null,\n \"instructions\": null,\n \"max_output_tokens\": null,\n \"model\": \"gpt-4.1-2025-04-14\",\n \"output\": [\n {\n \"type\": \"function_call\",\n \"id\": \"fc_67ca09c6bedc8190a7abfec07b1a1332096610f474011cc0\",\n \"call_id\": \"call_unLAR8MvFNptuiZK6K6HCy5k\",\n \"name\": \"get_current_weather\",\n \"arguments\": \"{\\\"location\\\":\\\"Boston, MA\\\",\\\"unit\\\":\\\"celsius\\\"}\",\n \"status\": \"completed\"\n }\n ],\n \"parallel_tool_calls\": true,\n \"previous_response_id\": null,\n \"reasoning\": {\n \"effort\": null,\n \"summary\": null\n },\n \"store\": true,\n \"temperature\": 1.0,\n \"text\": {\n \"format\": {\n \"type\": \"text\"\n }\n },\n \"tool_choice\": \"auto\",\n \"tools\": [\n {\n \"type\": \"function\",\n \"description\": \"Get the current weather in a given location\",\n \"name\": \"get_current_weather\",\n \"parameters\": {\n \"type\": \"object\",\n \"properties\": {\n \"location\": {\n \"type\": \"string\",\n \"description\": \"The city and state, e.g. San Francisco, CA\"\n },\n \"unit\": {\n \"type\": \"string\",\n \"enum\": [\n \"celsius\",\n \"fahrenheit\"\n ]\n }\n },\n \"required\": [\n \"location\",\n \"unit\"\n ]\n },\n \"strict\": true\n }\n ],\n \"top_p\": 1.0,\n \"truncation\": \"disabled\",\n \"usage\": {\n \"input_tokens\": 291,\n \"output_tokens\": 23,\n \"output_tokens_details\": {\n \"reasoning_tokens\": 0\n },\n \"total_tokens\": 314\n },\n \"user\": null,\n \"metadata\": {}\n}\n" | ||
| - title: Reasoning | ||
| request: | ||
| curl: "curl https://api.openai.com/v1/responses \\\n -H \"Content-Type: application/json\" \\\n -H \"Authorization: Bearer $OPENAI_API_KEY\" \\\n -d '{\n \"model\": \"o3-mini\",\n \"input\": \"How much wood would a woodchuck chuck?\",\n \"reasoning\": {\n \"effort\": \"high\"\n }\n }'\n" | ||
| javascript: "import OpenAI from \"openai\";\nconst openai = new OpenAI();\n\nconst response = await openai.responses.create({\n model: \"o3-mini\",\n input: \"How much wood would a woodchuck chuck?\",\n reasoning: {\n effort: \"high\"\n }\n});\n\nconsole.log(response);\n" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create(\n input=\"string\",\n model=\"gpt-4o\",\n)\nprint(response.id)" | ||
| python: "import os\nfrom openai import OpenAI\n\nclient = OpenAI(\n api_key=os.environ.get(\"OPENAI_API_KEY\"), # This is the default and can be omitted\n)\nresponse = client.responses.create()\nprint(response.id)" | ||
| csharp: "using System;\nusing OpenAI.Responses;\n\nOpenAIResponseClient client = new(\n model: \"o3-mini\",\n apiKey: Environment.GetEnvironmentVariable(\"OPENAI_API_KEY\")\n);\n\nstring userInputText = \"How much wood would a woodchuck chuck?\";\n\nResponseCreationOptions options = new()\n{\n ReasoningOptions = new()\n {\n ReasoningEffortLevel = ResponseReasoningEffortLevel.High,\n },\n};\n\nOpenAIResponse response = client.CreateResponse(userInputText, options);\n\nConsole.WriteLine(response.GetOutputText());\n" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create({ input: 'string', model: 'gpt-4o' });\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n \"github.com/openai/openai-go/shared\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n Input: responses.ResponseNewParamsInputUnion{\n OfString: openai.String(\"string\"),\n },\n Model: shared.ResponsesModel(\"gpt-4o\"),\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.ChatModel;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n ResponseCreateParams params = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build();\n Response response = client.responses().create(params);\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.ChatModel\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val params: ResponseCreateParams = ResponseCreateParams.builder()\n .input(\"string\")\n .model(ChatModel.GPT_4O)\n .build()\n val response: Response = client.responses().create(params)\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create(input: \"string\", model: :\"gpt-4o\")\n\nputs(response)" | ||
| node.js: "import OpenAI from 'openai';\n\nconst client = new OpenAI({\n apiKey: process.env['OPENAI_API_KEY'], // This is the default and can be omitted\n});\n\nconst response = await client.responses.create();\n\nconsole.log(response.id);" | ||
| go: "package main\n\nimport (\n \"context\"\n \"fmt\"\n\n \"github.com/openai/openai-go\"\n \"github.com/openai/openai-go/option\"\n \"github.com/openai/openai-go/responses\"\n)\n\nfunc main() {\n client := openai.NewClient(\n option.WithAPIKey(\"My API Key\"), // defaults to os.LookupEnv(\"OPENAI_API_KEY\")\n )\n response, err := client.Responses.New(context.TODO(), responses.ResponseNewParams{\n\n })\n if err != nil {\n panic(err.Error())\n }\n fmt.Printf(\"%+v\\n\", response.ID)\n}\n" | ||
| java: "package com.openai.example;\n\nimport com.openai.client.OpenAIClient;\nimport com.openai.client.okhttp.OpenAIOkHttpClient;\nimport com.openai.models.responses.Response;\nimport com.openai.models.responses.ResponseCreateParams;\n\npublic final class Main {\n private Main() {}\n\n public static void main(String[] args) {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n OpenAIClient client = OpenAIOkHttpClient.fromEnv();\n\n Response response = client.responses().create();\n }\n}" | ||
| kotlin: "package com.openai.example\n\nimport com.openai.client.OpenAIClient\nimport com.openai.client.okhttp.OpenAIOkHttpClient\nimport com.openai.models.responses.Response\nimport com.openai.models.responses.ResponseCreateParams\n\nfun main() {\n // Configures using the `OPENAI_API_KEY`, `OPENAI_ORG_ID`, `OPENAI_PROJECT_ID` and `OPENAI_BASE_URL` environment variables\n val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()\n\n val response: Response = client.responses().create()\n}" | ||
| ruby: "require \"openai\"\n\nopenai = OpenAI::Client.new(\n api_key: ENV[\"OPENAI_API_KEY\"] # This is the default and can be omitted\n)\n\nresponse = openai.responses.create\n\nputs(response)" | ||
| response: "{\n \"id\": \"resp_67ccd7eca01881908ff0b5146584e408072912b2993db808\",\n \"object\": \"response\",\n \"created_at\": 1741477868,\n \"status\": \"completed\",\n \"error\": null,\n \"incomplete_details\": null,\n \"instructions\": null,\n \"max_output_tokens\": null,\n \"model\": \"o1-2024-12-17\",\n \"output\": [\n {\n \"type\": \"message\",\n \"id\": \"msg_67ccd7f7b5848190a6f3e95d809f6b44072912b2993db808\",\n \"status\": \"completed\",\n \"role\": \"assistant\",\n \"content\": [\n {\n \"type\": \"output_text\",\n \"text\": \"The classic tongue twister...\",\n \"annotations\": []\n }\n ]\n }\n ],\n \"parallel_tool_calls\": true,\n \"previous_response_id\": null,\n \"reasoning\": {\n \"effort\": \"high\",\n \"summary\": null\n },\n \"store\": true,\n \"temperature\": 1.0,\n \"text\": {\n \"format\": {\n \"type\": \"text\"\n }\n },\n \"tool_choice\": \"auto\",\n \"tools\": [],\n \"top_p\": 1.0,\n \"truncation\": \"disabled\",\n \"usage\": {\n \"input_tokens\": 81,\n \"input_tokens_details\": {\n \"cached_tokens\": 0\n },\n \"output_tokens\": 1035,\n \"output_tokens_details\": {\n \"reasoning_tokens\": 832\n },\n \"total_tokens\": 1116\n },\n \"user\": null,\n \"metadata\": {}\n}\n" | ||
| '/responses/{response_id}': |
🛠️ Refactor suggestion
Unify language examples and clarify default behavior for omitted parameters
Examples for Python, Node.js, Go, Java, Kotlin, and Ruby now omit model and input, while cURL, JavaScript, and C# snippets still include them. This inconsistency may confuse users. Please:
- Either align all examples to the minimal pattern (no arguments), or provide both "Minimal" and "Full" variants for each language.
- Add a note explaining what defaults apply when `model` or `input` is omitted.
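To illustrate the relaxed schema, an empty request body is now structurally valid. The helper below is a hypothetical sketch (`build_responses_payload` is not part of any SDK) of how a client might construct both the minimal and full payload variants, including a field only when the caller supplies it:

```python
def build_responses_payload(model=None, input=None, **extra):
    """Build a /responses request body, omitting fields left as None.

    With the relaxed schema, `model` and `input` are no longer required,
    so an empty body {} is also a valid request payload.
    """
    payload = dict(extra)
    if model is not None:
        payload["model"] = model
    if input is not None:
        payload["input"] = input
    return payload

# Minimal variant: relies entirely on server-side defaults.
minimal = build_responses_payload()
# Full variant: explicitly pins the model and input.
full = build_responses_payload(model="gpt-4.1", input="Hello!")
print(minimal)  # → {}
print(full)     # → {'model': 'gpt-4.1', 'input': 'Hello!'}
```

Whatever defaulting behavior the server applies to the minimal variant is exactly what the suggested documentation note should spell out.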
🤖 Prompt for AI Agents
In src/libs/tryAGI.OpenAI/openapi.yaml between lines 5262 and 5345, the code
examples for different languages are inconsistent in including the 'model' and
'input' parameters, which can confuse users. To fix this, unify all language
examples by either removing 'model' and 'input' from all or providing two
variants for each: a "Minimal" example without these parameters and a "Full"
example including them. Additionally, add a clear note explaining the default
values used when 'model' or 'input' are omitted to clarify the behavior for
users.